
Reduce memory cost of liquid properties #69

Closed
jthorton wants to merge 2 commits into openforcefield:main from jthorton:memory

Conversation

@jthorton
Collaborator

Description

Try to reduce the memory cost of running a batch of liquid properties by running them one by one and summing the gradients. The downside is that we lose any savings from simulation deduplication, since the planner does not know about other simulations that have already been run.
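A minimal sketch of the one-by-one strategy described above, in plain PyTorch (this is not descent's actual API; the loss function and weights are hypothetical stand-ins for simulation-backed property targets). Calling `backward()` after each property frees that property's autograd graph immediately, while the gradients sum in `params.grad`, so only one property's graph is in memory at a time:

```python
import torch

# Hypothetical stand-in for a simulation-backed property loss; in descent
# this would come from simulating each liquid property.
def property_loss(params: torch.Tensor, weight: float) -> torch.Tensor:
    return weight * (params ** 2).sum()

params = torch.ones(3, requires_grad=True)
weights = [1.0, 2.0, 3.0]  # one hypothetical weight per property

# Batched (high memory): build one graph over all properties, then backward().
# One-by-one (low memory): backward() per property, letting .grad accumulate.
total_loss = 0.0
for w in weights:
    loss = property_loss(params, w)
    loss.backward()            # frees this property's graph right away;
    total_loss += loss.item()  # gradients accumulate in params.grad
```

The trade-off matches the description: each property is planned and evaluated in isolation, so duplicate simulations shared between properties can no longer be detected and reused.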

Status

  • Ready to go

@lilyminium
Contributor

@jthorton are you still keen on this PR? If so the tests are failing on formatting -- would you mind having a look at that?

@lilyminium
Contributor

Pinging @fjclark and @JMorado too in case you would find this useful -- otherwise I'll plan to close this PR, as we are planning to build our own distributed compute system in dimsim.

@JMorado

JMorado commented Mar 23, 2026

I’m not planning on using this PR for now. I think it would be useful in the future to have a wider range of memory-efficient closures. For example, I’m currently using the energies target, and the default_closure relies on its predict function, which processes one entry at a time (an entry can hold multiple configs). This is fine for a default_closure, and possibly the most performant option for small systems, but for larger systems it is virtually impossible to hold the complete graph for an entry in memory without running into OOM issues. My solution has been to write bespoke closures outside descent, which works just fine, but we might want to discuss whether it would be useful to provide some memory-efficient closures within the package itself.
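One way such a bespoke closure can avoid holding an entire entry's graph is to backpropagate per chunk of configs. The sketch below is a hypothetical illustration in plain PyTorch (the energy model, tensor shapes, and function name are all invented; this is not descent's closure API):

```python
import torch

def chunked_energy_loss(params, configs, ref_energies, chunk_size=2):
    """Backpropagate one chunk of configs at a time so only a single
    chunk's autograd graph is held in memory at once (hypothetical
    memory-efficient closure, not part of descent's API)."""
    total = 0.0
    for start in range(0, len(configs), chunk_size):
        x = configs[start : start + chunk_size]
        ref = ref_energies[start : start + chunk_size]
        pred = (x * params).sum(dim=-1)  # toy stand-in for an energy model
        loss = ((pred - ref) ** 2).sum()
        loss.backward()  # frees this chunk's graph; grads sum in params.grad
        total += loss.item()
    return total

params = torch.ones(2, requires_grad=True)
configs = torch.ones(4, 2)      # 4 configs in one entry
ref_energies = torch.zeros(4)
total = chunked_energy_loss(params, configs, ref_energies)
```

The accumulated gradient is identical to what a single backward pass over all configs would give, at the cost of one forward pass per chunk instead of one batched pass.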

@lilyminium
Contributor

Ok, thanks @JMorado! In that case I'll close this PR for now and we can reopen the discussion when memory becomes more of an issue.

@lilyminium lilyminium closed this Mar 24, 2026
